L1-norm Kernel PCA
Authors
Abstract
We present the first model and algorithm for L1-norm kernel PCA. While L2-norm kernel PCA has been widely studied, there has been no prior work on L1-norm kernel PCA. For this non-convex and non-smooth problem, we offer geometric insight through reformulations and present an efficient algorithm to which the kernel trick is applicable. To attest to the efficiency of the algorithm, we provide a convergence analysis, including a linear rate of convergence. Moreover, we prove that the output of our algorithm is a locally optimal solution to the L1-norm kernel PCA problem. We also show numerically that it is robust when extracting principal components in the presence of influential outliers, and that its runtime is comparable to that of L2-norm kernel PCA. Lastly, we introduce its application to outlier detection and show that the model based on L1-norm kernel PCA outperforms alternatives, especially on high-dimensional data.
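The abstract does not spell out the algorithm itself, so the sketch below is only an illustration of how an L1-norm kernel principal component could be computed with the kernel trick: it kernelizes the standard sign-flipping fixed-point iteration for L1-PCA under the assumption that the component lies in the span of the mapped data, w = sum_i alpha_i * phi(x_i). The names rbf_kernel and l1_kernel_pca_component, the RBF kernel choice, and the omission of kernel centering are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def l1_kernel_pca_component(K, max_iter=100, seed=0):
    """First L1-norm kernel principal component, returned via its expansion
    coefficients alpha, so that w = sum_i alpha_i * phi(x_i).

    Maximizes sum_i |<w, phi(x_i)>| over unit-norm w with a sign-flipping
    fixed-point iteration expressed entirely through the Gram matrix K.
    (Centering of K in feature space is omitted for brevity.)
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    # Random unit-norm initialization in the span of the mapped data.
    alpha = rng.standard_normal(n)
    alpha /= np.sqrt(alpha @ K @ alpha)
    for _ in range(max_iter):
        s = np.sign(K @ alpha)                 # signs of projections <w, phi(x_i)>
        s[s == 0] = 1.0
        alpha_new = s / np.sqrt(s @ K @ s)     # w <- sum_i s_i phi(x_i), renormalized
        if np.allclose(alpha_new, alpha):
            break
        alpha = alpha_new
    scores = K @ alpha                          # projections of all points onto w
    return alpha, scores

# Usage: 30 points in 2-D, the last one a gross outlier.
X = np.vstack([np.random.default_rng(1).normal(size=(29, 2)), [[10.0, -10.0]]])
K = rbf_kernel(X, gamma=0.5)
alpha, scores = l1_kernel_pca_component(K)
print(scores.shape)  # (30,)
```

Because each update only needs the products K @ alpha and s @ K @ s, the iteration never materializes phi(x) explicitly, which is what makes the kernel trick applicable; the projection scores K @ alpha are also the kind of quantity an outlier-detection rule could be built on.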
Similar resources
L1-norm Principal-Component Analysis in L2-norm-reduced-rank Data Subspaces
Standard Principal-Component Analysis (PCA) is known to be very sensitive to outliers among the processed data. On the other hand, it has recently been shown that L1-norm-based PCA (L1-PCA) exhibits sturdy resistance against outliers, while performing similarly to standard PCA when applied to nominal or smoothly corrupted data. Exact calculation of the K L1-norm Principal Components (L1-PCs) of ...
A pure L1-norm principal component analysis
The L1 norm has been applied in numerous variations of principal component analysis (PCA). L1-norm PCA is an attractive alternative to traditional L2-based PCA because it can impart robustness in the presence of outliers and is indicated for models where standard Gaussian assumptions about the noise may not apply. Of all the previously proposed PCA schemes that recast PCA as an optimization pro...
An efficient algorithm for L1-norm principal component analysis
Principal component analysis (PCA) (also called the Karhunen-Loève transform) has been widely used for dimensionality reduction, denoising, feature selection, subspace detection, and other purposes. However, traditional PCA minimizes the sum of squared errors and suffers from both outliers and large feature noise. The L1-norm-based PCA (more precisely, the L1,1 norm) is more robust. Yet, the optimizatio...
L1-norm-based (2D)PCA
Traditional bidirectional two-dimensional (2D) principal component analysis ((2D)PCA-L2) is sensitive to outliers because its objective function is the least-squares criterion based on the L2-norm. This paper proposes a simple but effective L1-norm-based bidirectional 2D principal component analysis ((2D)PCA-L1), which jointly takes advantage of the merits of bidirectional 2D subspace learning and L1...
Multiple Kernel Support Vector Regression with Higher Norm in Option Pricing
The purpose of the present study is to investigate a nonparametric model that improves the accuracy of option prices found by previous models. In this study, option prices are calculated using multiple kernel Support Vector Regression with different norm values, and the results are compared. L1-norm multiple kernel learning Support Vector Regression (MKLSVR) has been successfully applied to option price...
Journal: CoRR
Volume: abs/1709.10152
Publication date: 2017